106 research outputs found

    The unreasonable effectiveness of the final batch normalization layer

    Early-stage disease indications are rarely recorded in real-world domains such as agriculture and healthcare, and yet their accurate identification is critical at that point in time. In this type of highly imbalanced classification problem, which encompasses complex features, deep learning (DL) is much needed because of its strong detection capabilities. At the same time, DL is observed in practice to favor majority over minority classes and consequently to suffer from inaccurate detection of the targeted early-stage indications. In this work, we extend the study done by [11], showing that the final batch normalization (BN) layer, when placed before the softmax output layer, has a considerable impact in highly imbalanced image classification problems and also undermines the role of the softmax outputs as an uncertainty measure. This study addresses additional hypotheses and reports on the following findings: (i) the performance gain after adding the final BN layer in highly imbalanced settings can still be achieved after removing this additional BN layer at inference time; (ii) there is a certain threshold for the imbalance ratio at which the progress gained by the final BN layer reaches its peak; (iii) the batch size also plays a role and affects the outcome of the final BN application; (iv) the impact of the BN application is also reproducible on other datasets and when utilizing much simpler neural architectures; (v) the reported BN effect occurs only with a single majority class and multiple minority classes, i.e., no improvements are evident when there are two majority classes; and finally, (vi) utilizing this BN layer with sigmoid activation has almost no impact when dealing with strongly imbalanced image classification tasks.
    Algorithms and the Foundations of Software Technology
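The core setup studied in this abstract can be sketched in a few lines: a batch-normalization step applied to the classifier's logits immediately before the softmax output. This is an illustrative numpy sketch under assumed shapes, not the authors' code or architecture.

```python
# Illustrative sketch (not the paper's code): a final batch-normalization
# step over the logits, placed just before the softmax output layer.
# Batch size, class count, and parameter values are assumptions.
import numpy as np

def batch_norm(logits, gamma=1.0, beta=0.0, eps=1e-5):
    """Training-mode BN: normalize each logit dimension over the batch."""
    mean = logits.mean(axis=0, keepdims=True)
    var = logits.var(axis=0, keepdims=True)
    return gamma * (logits - mean) / np.sqrt(var + eps) + beta

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 3))          # batch of 8 samples, 3 classes
probs_plain = softmax(logits)             # without the final BN layer
probs_bn = softmax(batch_norm(logits))    # with the final BN layer

# Both outputs are valid probability distributions; BN recenters and
# rescales each class's logits over the batch, which is what changes the
# behavior in imbalanced settings and weakens softmax as an uncertainty
# measure.
assert np.allclose(probs_plain.sum(axis=1), 1.0)
assert np.allclose(probs_bn.sum(axis=1), 1.0)
```

In a real network the BN statistics would be learned running averages; the sketch only shows where the extra layer sits relative to the softmax.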

    Towards Self-Adaptive Efficient Global Optimization


    The significance of bug report elements


    Neural network design: learning from Neural Architecture Search

    Neural Architecture Search (NAS) aims to optimize the architecture of deep neural networks for better accuracy or smaller computational cost and has recently gained increasing research interest. Despite the various successful approaches proposed to solve the NAS task, its landscape and the properties thereof are rarely investigated. In this paper, we argue for the necessity of studying this landscape property and propose to use the so-called Exploratory Landscape Analysis (ELA) techniques for this goal. Taking a broad set of designs of deep convolutional networks, we conduct extensive experiments to obtain their performance. Based on our analysis of the experimental results, we observe high similarities between well-performing architecture designs, which can be used to significantly narrow the search space and thereby improve the efficiency of any NAS algorithm. Moreover, we extract the ELA features over the NAS landscapes on three common image classification data sets, MNIST, Fashion-MNIST, and CIFAR-10, which show that the NAS landscapes of these three data sets can be distinguished from one another. Also, when comparing them to the ELA features of the well-known Black-Box Optimization Benchmarking (BBOB) problem set, we find that the NAS landscapes surprisingly form a new problem class of their own, which can be separated from all 24 BBOB problems. Given this interesting observation, we therefore state the importance of further investigating the selection of an efficient optimizer for the NAS landscape, as well as the necessity of augmenting the current benchmark problem set.
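The idea behind ELA, as used in this abstract, is to characterize a black-box landscape by statistics computed from sampled objective values, and then to separate problem classes in that feature space. A toy sketch under assumed functions and sample sizes (real ELA, e.g. via the flacco/pflacco packages, computes many more feature classes):

```python
# Hedged toy illustration of ELA-style landscape features: summarize the
# distribution of sampled objective values and compare two landscapes.
# The two test functions and the two moments used are assumptions for
# illustration, not the paper's feature set.
import numpy as np

def y_distribution_features(f, dim, n_samples=1000, seed=0):
    """Sample the search space uniformly, standardize the objective values,
    and return simple distribution features (skewness, excess kurtosis)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, size=(n_samples, dim))
    y = np.array([f(x) for x in X])
    y = (y - y.mean()) / y.std()
    skew = np.mean(y ** 3)        # third standardized moment
    kurt = np.mean(y ** 4) - 3.0  # excess kurtosis
    return np.array([skew, kurt])

sphere = lambda x: np.sum(x ** 2)  # unimodal, smooth landscape
rastrigin = lambda x: 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

f1 = y_distribution_features(sphere, dim=5)
f2 = y_distribution_features(rastrigin, dim=5)

# Distinct landscapes yield distinct feature vectors, so a classifier over
# such features can separate problem classes -- the mechanism by which the
# paper tells NAS landscapes apart from the 24 BBOB problems.
assert not np.allclose(f1, f2)
```

The paper's separation result rests on the same principle, applied with a much richer feature set over actual NAS landscapes.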

    Requirements towards optimizing analytics in industrial processes


    Explorative data analysis of time series based algorithm features of CMA-ES variants


    Ship design performance and cost optimization with machine learning

    This contribution shows how, in the preliminary design stage, naval architects can make more informed decisions by using machine learning. In this ship design phase, little information is available and decisions need to be made in a limited amount of time. However, it is in the preliminary design phase that the most influential decisions are made regarding the global dimensions, the machinery, and therefore the performance and costs. In this paper it is shown that a machine learning algorithm trained with data from reference vessels is more accurate at estimating key performance indicators than existing empirical design formulas. Finally, the combination of the trained models with optimization algorithms proves to be a powerful tool for finding Pareto-optimal designs from which the naval architect can learn.
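The comparison the abstract describes, a model fitted on reference vessels versus a fixed empirical formula, can be sketched with synthetic data. Everything below (the KPI, the dimension ranges, the 0.30/0.35 coefficients) is invented for illustration and is not the paper's vessel database or formulas.

```python
# Hedged sketch: fit a linear model on synthetic "reference vessel" main
# dimensions to estimate a hypothetical KPI, and compare it against a crude
# fixed-coefficient "empirical formula". All numbers are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 200
length = rng.uniform(80, 300, n)   # vessel length [m] (assumed range)
beam = rng.uniform(12, 50, n)      # beam [m]
draft = rng.uniform(5, 15, n)      # draft [m]
# Hypothetical ground-truth KPI, roughly proportional to L*B*T plus noise:
kpi = 0.35 * length * beam * draft * (1 + 0.05 * rng.normal(size=n))

# Least-squares fit on the reference data (the "trained model"):
X = np.column_stack([length * beam * draft, length, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, kpi, rcond=None)

def predict(l, b, t):
    return coef[0] * l * b * t + coef[1] * l + coef[2]

# A fixed "empirical design formula" with a slightly-off coefficient:
empirical = lambda l, b, t: 0.30 * l * b * t

model_err = np.mean(np.abs(predict(length, beam, draft) - kpi))
emp_err = np.mean(np.abs(empirical(length, beam, draft) - kpi))
assert model_err < emp_err  # the fitted model tracks the data more closely

test_pred = predict(200.0, 30.0, 10.0)  # KPI estimate for a new design
```

The paper's point is exactly this mechanism at scale: a model calibrated on real reference vessels outperforms one-size-fits-all formulas, and can then be embedded in a multi-objective optimizer.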

    Unsupervised strategies for identifying optimal parameters in Quantum Approximate Optimization Algorithm

    As combinatorial optimization is one of the main quantum computing applications, many methods based on parameterized quantum circuits are being developed. In general, a set of parameters is tweaked to optimize a cost function computed from the quantum circuit output. One of these algorithms, the Quantum Approximate Optimization Algorithm (QAOA), stands out as a promising approach to tackling combinatorial problems. However, finding the appropriate parameters is a difficult task. Although QAOA exhibits concentration properties, these can depend on instance characteristics that may not be easy to identify, but that may nonetheless offer useful information for finding good parameters. In this work, we study unsupervised machine learning approaches for setting these parameters without optimization. We perform clustering on the angle values as well as on instance encodings (using instance features or the output of a variational graph autoencoder), and compare the different approaches. These angle-finding strategies can be used to reduce the number of calls to quantum circuits when leveraging QAOA as a subroutine. We showcase them within Recursive-QAOA up to depth 3, where the number of QAOA parameters used per iteration is limited to 3, achieving a median approximation ratio of 0.94 for MaxCut over 200 Erdős-Rényi graphs. We obtain performances similar to the case where we extensively optimize the angles, hence saving numerous circuit calls.
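The angle-clustering strategy described above can be sketched without any quantum machinery: cluster previously found QAOA angle vectors and reuse a cluster centroid as the parameters for new instances. The angle data and the two "regimes" below are synthetic assumptions, and the k-means here is a minimal stand-in for whatever clustering the authors used.

```python
# Hedged sketch of unsupervised angle-finding for QAOA: cluster (gamma, beta)
# angle vectors from training instances and reuse centroids on new instances,
# avoiding a per-instance circuit-based optimization loop. Synthetic data.
import numpy as np

def kmeans(X, k, n_iter=50):
    """Minimal Lloyd's algorithm with deterministic, spread-out init."""
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(n_iter):
        # Distance of every point to every centroid, then nearest assignment:
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(1)
# Synthetic depth-1 QAOA angles concentrated around two assumed regimes,
# mimicking the concentration behavior mentioned in the abstract:
angles = np.vstack([
    rng.normal([0.4, 0.7], 0.05, size=(30, 2)),  # (gamma, beta) regime A
    rng.normal([1.1, 0.2], 0.05, size=(30, 2)),  # (gamma, beta) regime B
])
centroids, labels = kmeans(angles, k=2)

# A new instance assigned to a cluster simply inherits that cluster's
# centroid as its QAOA angles -- no further circuit calls for tuning.
new_instance_angles = centroids[0]
```

Clustering on instance encodings (features or graph-autoencoder embeddings) works the same way, except points are instances rather than angle vectors and each cluster stores representative angles.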

    Learning adaptive differential evolution algorithm from optimization experiences by policy gradient


    One-shot optimization for vehicle dynamics control systems: towards benchmarking and exploratory landscape analysis
